Connecting Language and Knowledge Bases with Embedding Models for Relation Extraction
Abstract
This paper proposes a novel approach to relation extraction from free text, trained to jointly use information from the text and from existing knowledge. Our model is based on scoring functions that operate by learning low-dimensional embeddings of words, entities and relationships from a knowledge base. We empirically show on New York Times articles aligned with Freebase relations that our approach is able to efficiently use the extra information provided by a large subset of Freebase data (4M entities, 23k relationships) to improve over methods that rely on text features alone.
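The abstract leaves the scoring functions unspecified, so the following is only a minimal sketch of the general idea under stated assumptions: one score compares an embedded textual relation mention with a relation embedding, and a second, TransE-style score rates the plausibility of the corresponding knowledge-base triple. The entity and relation names, the embedding dimension, and the sum-of-word-vectors mention encoder are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 50  # illustrative embedding dimension

# Toy vocabularies; in the paper's setting these would come from the NYT
# corpus and the Freebase subset. The identifiers here are hypothetical.
words = ["obama", "was", "born", "in", "hawaii"]
entities = ["m.obama", "m.hawaii"]
relations = ["place_of_birth", "nationality"]

# Low-dimensional embeddings for words, entities and relations.
W = {w: rng.normal(scale=0.1, size=dim) for w in words}
E = {e: rng.normal(scale=0.1, size=dim) for e in entities}
R = {r: rng.normal(scale=0.1, size=dim) for r in relations}

def score_text(mention_words, relation):
    """Text-side score: similarity between an embedded relation mention
    (here just a sum of word vectors) and the relation embedding."""
    f_m = np.sum([W[w] for w in mention_words], axis=0)
    return float(f_m @ R[relation])

def score_kb(head, relation, tail):
    """KB-side score in the TransE style: plausible triples have
    head + relation close to tail."""
    return -float(np.linalg.norm(E[head] + R[relation] - E[tail]))

def score_joint(mention_words, head, relation, tail):
    """Combine textual and knowledge-base evidence for a candidate relation."""
    return score_text(mention_words, relation) + score_kb(head, relation, tail)

# Rank candidate relations for one sentence and entity pair.
mention = ["obama", "was", "born", "in", "hawaii"]
for rel in relations:
    print(rel, round(score_joint(mention, "m.obama", rel, "m.hawaii"), 3))
```

In a trained system the embeddings would be fit with a ranking objective so that observed mentions and triples score higher than corrupted ones; with the random vectors above the ranking is meaningless and only shows how the two sources of evidence are combined.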
Similar Resources
Reading and Reasoning with Knowledge Graphs
Much attention has recently been given to the creation of large knowledge bases that contain millions of facts about people, things, and places in the world. These knowledge bases have proven to be incredibly useful for enriching search results, answering factoid questions, and training semantic parsers and relation extractors. The way the knowledge base is actually used in these systems, howev...
Jointly Embedding Relations and Mentions for Knowledge Population
This paper contributes a joint embedding model for predicting relations between a pair of entities in the scenario of relation inference. It differs from most standalone approaches which separately operate on either knowledge bases or free texts. The proposed model simultaneously learns low-dimensional vector representations for both triplets in knowledge repositories and the mentions of relati...
From Strings to Things - SAR-Graphs: A New Type of Resource for Connecting Knowledge and Language
Recent research and development have created the necessary ingredients for a major push in web-scale language understanding: large repositories of structured knowledge (DBpedia, the Google knowledge graph, Freebase, YAGO), progress in language processing (parsing, information extraction, computational semantics), linguistic knowledge resources (Treebanks, WordNet, BabelNet, UWN) and new powerful...
Learning Knowledge Graph Embeddings for Natural Language Processing
Knowledge graph embeddings, which have been introduced recently, provide a powerful latent semantic representation for the structured knowledge in knowledge graphs. Unlike the already widely used word embeddings learned from plain text, knowledge graph embeddings enable direct, explicit relational inferences among entities via simple calculations on embedding vectors. In par...
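The "simple calculations on embedding vectors" mentioned above are typically translation-style operations of the TransE family: adding a relation vector to an entity vector and looking for the nearest entity. The toy sketch below uses made-up entity names and untrained random vectors purely to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 20  # illustrative dimension

# Hypothetical, untrained embeddings; a real system would learn them from a
# knowledge graph so that head + relation lands near the correct tail.
entities = {name: rng.normal(size=dim)
            for name in ["paris", "france", "berlin", "germany"]}
relations = {"capital_of": rng.normal(size=dim)}

def predict_tail(head, relation, k=2):
    # Translation-style inference: rank candidate tails by their distance
    # to the vector head + relation.
    query = entities[head] + relations[relation]
    return sorted(entities, key=lambda e: np.linalg.norm(query - entities[e]))[:k]

print(predict_tail("paris", "capital_of"))
```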
Sar-graphs: A Linked Linguistic Knowledge Resource Connecting Facts with Language
We present sar-graphs, a knowledge resource that links semantic relations from factual knowledge graphs to the linguistic patterns with which a language can express instances of these relations. Sar-graphs expand upon existing lexicosemantic resources by modeling syntactic and semantic information at the level of relations, and are hence useful for tasks such as knowledge base population and re...